# Compact and Efficient

## Qwen2.5 0.5b Test Ft (Apache-2.0)

A compact yet capable language model fine-tuned from Qwen/Qwen2.5-0.5B-Instruct. It supports multiple languages, with performance close to the Llama 3.2 1B model.

Tags: Large Language Model · Transformers · Supports Multiple Languages
Author: KingNish
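As a minimal sketch of running a small instruct-tuned causal LM like this with Transformers: the Hub id `KingNish/Qwen2.5-0.5b-Test-ft` below is an assumption inferred from the author and model name on the card, and the chat template is assumed to be inherited from the Qwen2.5 base.

```python
# Minimal sketch, not the author's published usage.
# The repo id is an assumed Hub id inferred from the card above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "KingNish/Qwen2.5-0.5b-Test-ft"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Qwen2.5 instruct models ship a chat template; we assume the fine-tune keeps it.
messages = [{"role": "user", "content": "In one sentence, what is a 0.5B model good for?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=64)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```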
## Mermaidstable3b

A compact language model specialized in generating Mermaid flowcharts; it efficiently converts code and textual narratives into visual diagrams.

Tags: Large Language Model · Transformers
Author: TroyDoesAI
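A diagram generator like this can be driven through the Transformers text-generation pipeline, as sketched below. The Hub id `TroyDoesAI/MermaidStable3B` is an assumption based on the author and model name, and the prompt format is illustrative rather than the model's documented template.

```python
# Minimal sketch: prompting a causal LM for Mermaid output via the
# text-generation pipeline. The repo id is an assumed Hub id.
from transformers import pipeline

generator = pipeline("text-generation", model="TroyDoesAI/MermaidStable3B")

prompt = (
    "Create a Mermaid flowchart for this logic:\n"
    "If the user is logged in, show the dashboard; otherwise show the login page.\n"
)
result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])  # expected to contain Mermaid flowchart syntax
```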
## T5 Small Korean Summarization

A Korean text summarization model based on the T5 architecture, optimized to produce concise, accurate summaries of Korean text.

Tags: Text Generation · Transformers · Korean
Author: eenzeenee
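A summarization model like this is typically driven through the seq2seq API. The sketch below assumes the Hub id `eenzeenee/t5-small-korean-summarization` (inferred from the card) and the common T5 `summarize:` task prefix; the model's actual prefix, if any, may differ.

```python
# Minimal sketch of T5-style summarization with the seq2seq API.
# Repo id and task prefix are assumptions, not documented specifics.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "eenzeenee/t5-small-korean-summarization"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# "A large conference on AI technology was held in Seoul today."
article = "오늘 서울에서 인공지능 기술을 주제로 한 대규모 콘퍼런스가 열렸다."
inputs = tokenizer("summarize: " + article, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```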
## Chinese Legal Electra Base Generator (Apache-2.0)

Chinese ELECTRA is a series of Chinese pre-trained models based on Google's ELECTRA, released by the HIT & iFLYTEK Joint Lab (HFL). This legal-domain base generator combines a compact structure with strong performance.

Tags: Large Language Model · Transformers · Chinese
Author: hfl
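An ELECTRA generator is a masked language model, so one way to exercise it is the fill-mask pipeline. The sketch below assumes the Hub id `hfl/chinese-legal-electra-base-generator` (inferred from the card) and uses an illustrative legal-style sentence.

```python
# Minimal sketch: masked-token filling with an ELECTRA generator head.
# The repo id is an assumed Hub id inferred from the card above.
from transformers import pipeline

fill = pipeline("fill-mask", model="hfl/chinese-legal-electra-base-generator")

# "This contract takes effect on the date both parties sign," with one masked character.
for candidate in fill("本合同自双方签字之日起生[MASK]。"):
    print(candidate["token_str"], round(candidate["score"], 4))
```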
## Roberta Ko Small (Apache-2.0)

A compact Korean RoBERTa model trained under the LASSL framework, suitable for a range of Korean natural language processing tasks.

Tags: Large Language Model · Transformers · Korean
Author: lassl
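Small encoder models like this are most often used as feature extractors or fine-tuning backbones. The sketch below assumes the Hub id `lassl/roberta-ko-small` (inferred from the card) and simply pulls token embeddings for a Korean sentence.

```python
# Minimal sketch: encoding a Korean sentence with a small RoBERTa encoder.
# The repo id is an assumed Hub id inferred from the card above.
from transformers import AutoModel, AutoTokenizer

model_id = "lassl/roberta-ko-small"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# "This is an example of encoding a Korean sentence."
inputs = tokenizer("한국어 문장 인코딩 예시입니다.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```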